Search Results

Documents authored by De, Anindya


Document
Testing Intersecting and Union-Closed Families

Authors: Xi Chen, Anindya De, Yuhao Li, Shivam Nadimpalli, and Rocco A. Servedio

Published in: LIPIcs, Volume 287, 15th Innovations in Theoretical Computer Science Conference (ITCS 2024)


Abstract
Inspired by the classic problem of Boolean function monotonicity testing, we investigate the testability of other well-studied properties of combinatorial finite set systems, specifically intersecting families and union-closed families. A function f: {0,1}ⁿ → {0,1} is intersecting (respectively, union-closed) if its set of satisfying assignments corresponds to an intersecting family (respectively, a union-closed family) of subsets of [n]. Our main results are that - in sharp contrast with the property of being a monotone set system - the property of being an intersecting set system, and the property of being a union-closed set system, both turn out to be information-theoretically difficult to test. We show that:
- For ε ≥ Ω(1/√n), any non-adaptive two-sided ε-tester for intersectingness must make 2^{Ω(n^{1/4}/√ε)} queries. We also give a 2^{Ω(√{n log(1/ε)})}-query lower bound for non-adaptive one-sided ε-testers for intersectingness.
- For ε ≥ 1/2^{Ω(n^{0.49})}, any non-adaptive two-sided ε-tester for union-closedness must make n^{Ω(log(1/ε))} queries.
Thus, neither intersectingness nor union-closedness shares the poly(n, 1/ε)-query non-adaptive testability that is enjoyed by monotonicity. To complement our lower bounds, we also give a simple poly(n^{√{n log(1/ε)}}, 1/ε)-query, one-sided, non-adaptive algorithm for ε-testing each of these properties (intersectingness and union-closedness). We thus achieve nearly tight upper and lower bounds for two-sided testing of intersectingness when ε = Θ(1/√n), and for one-sided testing of intersectingness when ε = Θ(1).
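
For readers new to these properties, the following is a minimal brute-force Python sketch of the two definitions above. It enumerates all 2ⁿ inputs, so it is only meant for tiny n, and it is emphatically not the sublinear-query tester from the paper; the example function maj3 is our own illustrative choice.

from itertools import product

def satisfying_sets(f, n):
    """Subsets of [n] (as frozensets) whose indicator vectors f maps to 1."""
    return [frozenset(i for i in range(n) if x[i])
            for x in product([0, 1], repeat=n) if f(x)]

def is_intersecting(f, n):
    """Intersecting: every two satisfying sets share at least one element."""
    sets = satisfying_sets(f, n)
    return all(a & b for a in sets for b in sets)

def is_union_closed(f, n):
    """Union-closed: the union of any two satisfying sets is again satisfying."""
    sets = set(satisfying_sets(f, n))
    return all((a | b) in sets for a in sets for b in sets)

maj3 = lambda x: int(sum(x) >= 2)                            # majority on 3 bits
print(is_intersecting(maj3, 3), is_union_closed(maj3, 3))    # True True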

Cite as

Xi Chen, Anindya De, Yuhao Li, Shivam Nadimpalli, and Rocco A. Servedio. Testing Intersecting and Union-Closed Families. In 15th Innovations in Theoretical Computer Science Conference (ITCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 287, pp. 33:1-33:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{chen_et_al:LIPIcs.ITCS.2024.33,
  author =	{Chen, Xi and De, Anindya and Li, Yuhao and Nadimpalli, Shivam and Servedio, Rocco A.},
  title =	{{Testing Intersecting and Union-Closed Families}},
  booktitle =	{15th Innovations in Theoretical Computer Science Conference (ITCS 2024)},
  pages =	{33:1--33:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-309-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{287},
  editor =	{Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2024.33},
  URN =		{urn:nbn:de:0030-drops-195610},
  doi =		{10.4230/LIPIcs.ITCS.2024.33},
  annote =	{Keywords: Sublinear algorithms, property testing, computational complexity, monotonicity, intersecting families, union-closed families}
}
Document
Convex Influences

Authors: Anindya De, Shivam Nadimpalli, and Rocco A. Servedio

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
We introduce a new notion of influence for symmetric convex sets over Gaussian space, which we term "convex influence". We show that this new notion of influence shares many of the familiar properties of influences of variables for monotone Boolean functions f: {±1}ⁿ → {±1}. Our main results for convex influences give Gaussian space analogues of many important results on influences for monotone Boolean functions. These include (robust) characterizations of extremal functions, the Poincaré inequality, the Kahn-Kalai-Linial theorem [J. Kahn et al., 1988], a sharp threshold theorem of Kalai [G. Kalai, 2004], a stability version of the Kruskal-Katona theorem due to O'Donnell and Wimmer [R. O'Donnell and K. Wimmer, 2009], and some partial results towards a Gaussian space analogue of Friedgut’s junta theorem [E. Friedgut, 1998]. The proofs of our results for convex influences use very different techniques than the analogous proofs for Boolean influences over {±1}ⁿ. Taken as a whole, our results extend the emerging analogy between symmetric convex sets in Gaussian space and monotone Boolean functions from {±1}ⁿ to {±1}.
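
For orientation, recall the Boolean Kahn-Kalai-Linial theorem referenced in the abstract; the standard Boolean statement (reproduced here for reference only, not the paper's Gaussian-space analogue) is:

\[
  \max_{i \in [n]} \mathrm{Inf}_i(f) \;\ge\; c \cdot \mathbf{Var}[f] \cdot \frac{\log n}{n}
  \qquad \text{for every } f\colon \{\pm 1\}^n \to \{\pm 1\},
\]
where $c > 0$ is an absolute constant, $\mathrm{Inf}_i(f) = \Pr_x[f(x) \ne f(x^{\oplus i})]$, and $x^{\oplus i}$ denotes $x$ with its $i$-th coordinate flipped. The paper's convex influences play the role of $\mathrm{Inf}_i(f)$ for symmetric convex sets in Gaussian space.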

Cite as

Anindya De, Shivam Nadimpalli, and Rocco A. Servedio. Convex Influences. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 53:1-53:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{de_et_al:LIPIcs.ITCS.2022.53,
  author =	{De, Anindya and Nadimpalli, Shivam and Servedio, Rocco A.},
  title =	{{Convex Influences}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{53:1--53:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.53},
  URN =		{urn:nbn:de:0030-drops-156498},
  doi =		{10.4230/LIPIcs.ITCS.2022.53},
  annote =	{Keywords: Fourier analysis of Boolean functions, convex geometry, influences, threshold phenomena}
}
Document
Polynomial-Time Trace Reconstruction in the Low Deletion Rate Regime

Authors: Xi Chen, Anindya De, Chin Ho Lee, Rocco A. Servedio, and Sandip Sinha

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
In the trace reconstruction problem, an unknown source string x ∈ {0,1}ⁿ is transmitted through a probabilistic deletion channel which independently deletes each bit with some fixed probability δ and concatenates the surviving bits, resulting in a trace of x. The problem is to reconstruct x given access to independent traces. Trace reconstruction of arbitrary (worst-case) strings is a challenging problem, with the current state of the art for poly(n)-time algorithms being the 2004 algorithm of Batu et al. [T. Batu et al., 2004]. This algorithm can reconstruct an arbitrary source string x ∈ {0,1}ⁿ in poly(n) time provided that the deletion rate δ satisfies δ ≤ n^{-(1/2 + ε)} for some ε > 0. In this work we improve on the result of [T. Batu et al., 2004] by giving a poly(n)-time algorithm for trace reconstruction for any deletion rate δ ≤ n^{-(1/3 + ε)}. Our algorithm works by alternating an alignment-based procedure, which we show effectively reconstructs portions of the source string that are not "highly repetitive", with a novel procedure that efficiently determines the length of highly repetitive subwords of the source string.
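
To make the channel model concrete, here is a minimal Python simulation of the deletion channel described above; the source string and deletion rate are hypothetical values chosen for illustration, and the reconstruction algorithm itself is not sketched.

import random

def trace(x: str, delta: float) -> str:
    """One trace of x: each bit is deleted independently with probability delta."""
    return "".join(bit for bit in x if random.random() > delta)

x = "0110100110010110"                          # hypothetical source string
traces = [trace(x, delta=0.1) for _ in range(5)]
print(traces)                                   # five independent traces of x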

Cite as

Xi Chen, Anindya De, Chin Ho Lee, Rocco A. Servedio, and Sandip Sinha. Polynomial-Time Trace Reconstruction in the Low Deletion Rate Regime. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 20:1-20:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{chen_et_al:LIPIcs.ITCS.2021.20,
  author =	{Chen, Xi and De, Anindya and Lee, Chin Ho and Servedio, Rocco A. and Sinha, Sandip},
  title =	{{Polynomial-Time Trace Reconstruction in the Low Deletion Rate Regime}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{20:1--20:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.20},
  URN =		{urn:nbn:de:0030-drops-135595},
  doi =		{10.4230/LIPIcs.ITCS.2021.20},
  annote =	{Keywords: trace reconstruction}
}
Document
Quantitative Correlation Inequalities via Semigroup Interpolation

Authors: Anindya De, Shivam Nadimpalli, and Rocco A. Servedio

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
Most correlation inequalities for high-dimensional functions in the literature, such as the Fortuin-Kasteleyn-Ginibre inequality and the celebrated Gaussian Correlation Inequality of Royen, are qualitative statements which establish that any two functions of a certain type have non-negative correlation. We give a general approach that can be used to bootstrap many qualitative correlation inequalities for functions over product spaces into quantitative statements. The approach combines a new extremal result about power series, proved using complex analysis, with harmonic analysis of functions over product spaces. We instantiate this general approach in several different concrete settings to obtain a range of new and near-optimal quantitative correlation inequalities, including:
- A quantitative version of Royen's celebrated Gaussian Correlation Inequality [Royen, 2014]. In [Royen, 2014] Royen confirmed a conjecture, open for 40 years, stating that any two symmetric convex sets must be non-negatively correlated under any centered Gaussian distribution. We give a lower bound on the correlation in terms of the vector of degree-2 Hermite coefficients of the two convex sets, conceptually similar to Talagrand's quantitative correlation bound for monotone Boolean functions over {0,1}ⁿ [M. Talagrand, 1996]. We show that our quantitative version of Royen's theorem is within a logarithmic factor of being optimal.
- A quantitative version of the well-known FKG inequality for monotone functions over any finite product probability space. This is a broad generalization of Talagrand's quantitative correlation bound for functions from {0,1}ⁿ to {0,1} under the uniform distribution [M. Talagrand, 1996]; the only prior generalization of which we are aware is due to Keller [Nathan Keller, 2012; Keller, 2008; Nathan Keller, 2009], which extended [M. Talagrand, 1996] to product distributions over {0,1}ⁿ. In the special case of p-biased distributions over {0,1}ⁿ that was considered by Keller, our new bound essentially saves a factor of p log(1/p) over the quantitative bounds given in [Nathan Keller, 2012; Keller, 2008; Nathan Keller, 2009]. We also give a quantitative version of the FKG inequality for monotone functions over the continuous domain [0,1]ⁿ, answering a question of Keller [Nathan Keller, 2009].
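
As a small sanity check of the qualitative statement being made quantitative, the sketch below verifies, by exhaustive enumeration, the FKG/Harris inequality E[fg] ≥ E[f]E[g] for two monotone Boolean functions under the uniform distribution on {0,1}ⁿ; the particular functions f and g are arbitrary monotone examples of our own choosing, not taken from the paper.

from itertools import product

def expectation(h, n):
    """Expectation of h under the uniform distribution on {0,1}^n (exact, small n)."""
    points = list(product([0, 1], repeat=n))
    return sum(h(x) for x in points) / len(points)

n = 4
f = lambda x: int(x[0] or (x[1] and x[2]))   # monotone DNF
g = lambda x: int(sum(x) >= 3)               # monotone threshold function
lhs = expectation(lambda x: f(x) * g(x), n)
rhs = expectation(f, n) * expectation(g, n)
print(lhs, rhs, lhs >= rhs)                  # correlation is non-negative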

Cite as

Anindya De, Shivam Nadimpalli, and Rocco A. Servedio. Quantitative Correlation Inequalities via Semigroup Interpolation. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 69:1-69:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{de_et_al:LIPIcs.ITCS.2021.69,
  author =	{De, Anindya and Nadimpalli, Shivam and Servedio, Rocco A.},
  title =	{{Quantitative Correlation Inequalities via Semigroup Interpolation}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{69:1--69:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.69},
  URN =		{urn:nbn:de:0030-drops-136081},
  doi =		{10.4230/LIPIcs.ITCS.2021.69},
  annote =	{Keywords: complex analysis, correlation inequality, FKG inequality, Gaussian correlation inequality, harmonic analysis, Markov semigroups}
}
Document
Track A: Algorithms, Complexity and Games
An Efficient PTAS for Stochastic Load Balancing with Poisson Jobs

Authors: Anindya De, Sanjeev Khanna, Huan Li, and Hesam Nikpey

Published in: LIPIcs, Volume 168, 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)


Abstract
We give the first polynomial-time approximation scheme (PTAS) for the stochastic load balancing problem when the job sizes follow Poisson distributions. This improves upon the 2-approximation algorithm due to Goel and Indyk (FOCS'99). Moreover, our approximation scheme is an efficient PTAS whose running time is doubly exponential in 1/ε but nearly linear in n, where n is the number of jobs and ε is the target error. Previously, a PTAS (not an efficient one) was only known for jobs that obey exponential distributions (Goel and Indyk, FOCS'99). Our algorithm relies on several probabilistic ingredients, including some (seemingly) new results on scaling and the so-called "focusing effect" of the maximum of Poisson random variables, which might be of independent interest.
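
The sketch below illustrates the objective being approximated (expected makespan of a fixed assignment of Poisson jobs to machines) via Monte Carlo; the rates and the assignment are hypothetical, and this is not the PTAS itself. It uses the fact that a sum of independent Poisson variables is again Poisson, so each machine's load is Poisson with the summed rate of its jobs.

import numpy as np

rates = [0.7, 1.3, 0.5, 2.0, 0.9, 1.1]       # hypothetical Poisson rates, one per job
assignment = [0, 1, 0, 2, 1, 2]              # job i is placed on machine assignment[i]
m = 3
machine_rates = [sum(r for r, a in zip(rates, assignment) if a == j) for j in range(m)]

samples = 100_000
loads = np.random.poisson(lam=machine_rates, size=(samples, m))
print("estimated expected makespan:", loads.max(axis=1).mean())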

Cite as

Anindya De, Sanjeev Khanna, Huan Li, and Hesam Nikpey. An Efficient PTAS for Stochastic Load Balancing with Poisson Jobs. In 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 168, pp. 37:1-37:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{de_et_al:LIPIcs.ICALP.2020.37,
  author =	{De, Anindya and Khanna, Sanjeev and Li, Huan and Nikpey, Hesam},
  title =	{{An Efficient PTAS for Stochastic Load Balancing with Poisson Jobs}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{37:1--37:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Czumaj, Artur and Dawar, Anuj and Merelli, Emanuela},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2020.37},
  URN =		{urn:nbn:de:0030-drops-124449},
  doi =		{10.4230/LIPIcs.ICALP.2020.37},
  annote =	{Keywords: Efficient PTAS, Makespan Minimization, Scheduling, Stochastic Load Balancing, Poisson Distribution}
}
Document
Simple and Efficient Pseudorandom Generators from Gaussian Processes

Authors: Eshan Chattopadhyay, Anindya De, and Rocco A. Servedio

Published in: LIPIcs, Volume 137, 34th Computational Complexity Conference (CCC 2019)


Abstract
We show that a very simple pseudorandom generator fools intersections of k linear threshold functions (LTFs) and arbitrary functions of k LTFs over n-dimensional Gaussian space. The two analyses of our PRG (for intersections versus arbitrary functions of LTFs) are quite different from each other and from previous analyses of PRGs for functions of halfspaces. Our analysis for arbitrary functions of LTFs establishes bounds on the Wasserstein distance between Gaussian random vectors with similar covariance matrices, and combines these bounds with a conversion from Wasserstein distance to "union-of-orthants" distance from [Xi Chen et al., 2014]. Our analysis for intersections of LTFs uses extensions of the classical Sudakov-Fernique type inequalities, which give bounds on the difference between the expectations of the maxima of two Gaussian random vectors with similar covariance matrices.

For all values of k, our generator has seed length O(log n) + poly(k) for arbitrary functions of k LTFs and O(log n) + poly(log k) for intersections of k LTFs. The best previous result, due to [Gopalan et al., 2010], only gave such PRGs for arbitrary functions of k LTFs when k = O(log log n) and for intersections of k LTFs when k = O((log n)/(log log n)). Thus our PRG achieves an O(log n) seed length for values of k that are exponentially larger than previous work could achieve.

By combining our PRG over Gaussian space with an invariance principle for arbitrary functions of LTFs and with a regularity lemma, we obtain a deterministic algorithm that approximately counts satisfying assignments of arbitrary functions of k general LTFs over {0,1}^n in time poly(n) * 2^{poly(k,1/epsilon)} for all values of k. This algorithm has a poly(n) runtime for k = (log n)^c for some absolute constant c > 0, while the previous best poly(n)-time algorithms could only handle k = O(log log n). For intersections of LTFs, by combining these tools with a recent PRG due to [R. O'Donnell et al., 2018], we obtain a deterministic algorithm that can approximately count satisfying assignments of intersections of k general LTFs over {0,1}^n in time poly(n) * 2^{poly(log k, 1/epsilon)}. This algorithm has a poly(n) runtime for k = 2^{(log n)^c} for some absolute constant c > 0, while the previous best poly(n)-time algorithms for intersections of k LTFs, due to [Gopalan et al., 2010], could only handle k = O((log n)/(log log n)).
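
To picture the function class being fooled, the sketch below builds an intersection of k LTFs over n-dimensional Gaussian space with randomly chosen (hypothetical) weights and thresholds, and estimates its Gaussian measure from genuinely random samples. A PRG with short seed would aim to reproduce such estimates using far less randomness; this is only an illustration of the target quantity, not the generator's construction.

import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 5
W = rng.standard_normal((k, n))              # weight vector of each LTF
theta = rng.standard_normal(k)               # threshold of each LTF

samples = 200_000
X = rng.standard_normal((samples, n))        # truly random Gaussian inputs
ltf_values = X @ W.T >= theta                # i-th column: is <w_i, x> >= theta_i?
intersection = ltf_values.all(axis=1)        # AND of the k LTFs
print("estimated Gaussian measure of the intersection:", intersection.mean())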

Cite as

Eshan Chattopadhyay, Anindya De, and Rocco A. Servedio. Simple and Efficient Pseudorandom Generators from Gaussian Processes. In 34th Computational Complexity Conference (CCC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 137, pp. 4:1-4:33, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{chattopadhyay_et_al:LIPIcs.CCC.2019.4,
  author =	{Chattopadhyay, Eshan and De, Anindya and Servedio, Rocco A.},
  title =	{{Simple and Efficient Pseudorandom Generators from Gaussian Processes}},
  booktitle =	{34th Computational Complexity Conference (CCC 2019)},
  pages =	{4:1--4:33},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-116-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{137},
  editor =	{Shpilka, Amir},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2019.4},
  URN =		{urn:nbn:de:0030-drops-108262},
  doi =		{10.4230/LIPIcs.CCC.2019.4},
  annote =	{Keywords: Polynomial threshold functions, Gaussian processes, Johnson-Lindenstrauss, pseudorandom generators}
}
Document
Density Estimation for Shift-Invariant Multidimensional Distributions

Authors: Anindya De, Philip M. Long, and Rocco A. Servedio

Published in: LIPIcs, Volume 124, 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)


Abstract
We study density estimation for classes of shift-invariant distributions over R^d. A multidimensional distribution is "shift-invariant" if, roughly speaking, it is close in total variation distance to a small shift of it in any direction. Shift-invariance relaxes smoothness assumptions commonly used in non-parametric density estimation to allow jump discontinuities. The different classes of distributions that we consider correspond to different rates of tail decay. For each such class we give an efficient algorithm that learns any distribution in the class from independent samples with respect to total variation distance. As a special case of our general result, we show that d-dimensional shift-invariant distributions which satisfy an exponential tail bound can be learned to total variation distance error epsilon using O~_d(1/epsilon^{d+2}) examples and O~_d(1/epsilon^{2d+2}) time. This implies that, for constant d, multivariate log-concave distributions can be learned in O~_d(1/epsilon^{2d+2}) time using O~_d(1/epsilon^{d+2}) samples, answering a question of [Diakonikolas et al., 2016]. All of our results extend to a model of noise-tolerant density estimation using Huber's contamination model, in which the target distribution to be learned is a (1-epsilon, epsilon) mixture of some unknown distribution in the class with some other arbitrary and unknown distribution, and the learning algorithm must output a hypothesis distribution with total variation distance error O(epsilon) from the target distribution. We show that our general results are close to best possible by proving a simple Omega(1/epsilon^d) information-theoretic lower bound on sample complexity even for learning bounded distributions that are shift-invariant.
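
One way to read the informal definition above more formally (a hedged paraphrase of "close in total variation to a small shift in any direction"; the paper's exact parameterization may differ) is:

\[
  \sup_{\|v\|_2 \le \delta} \, d_{\mathrm{TV}}\bigl(p,\; p(\cdot - v)\bigr) \;\le\; \kappa(\delta),
\]
where $p(\cdot - v)$ denotes the density $p$ shifted by the vector $v \in \mathbb{R}^d$ and $\kappa(\delta)$ is small for small $\delta$, so that shifting the distribution slightly in any direction changes it only slightly in total variation distance.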

Cite as

Anindya De, Philip M. Long, and Rocco A. Servedio. Density Estimation for Shift-Invariant Multidimensional Distributions. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 28:1-28:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{de_et_al:LIPIcs.ITCS.2019.28,
  author =	{De, Anindya and Long, Philip M. and Servedio, Rocco A.},
  title =	{{Density Estimation for Shift-Invariant Multidimensional Distributions}},
  booktitle =	{10th Innovations in Theoretical Computer Science Conference (ITCS 2019)},
  pages =	{28:1--28:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-095-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{124},
  editor =	{Blum, Avrim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2019.28},
  URN =		{urn:nbn:de:0030-drops-101214},
  doi =		{10.4230/LIPIcs.ITCS.2019.28},
  annote =	{Keywords: Density estimation, unsupervised learning, log-concave distributions, non-parametrics}
}
Document
Noise Stability Is Computable and Approximately Low-Dimensional

Authors: Anindya De, Elchanan Mossel, and Joe Neeman

Published in: LIPIcs, Volume 79, 32nd Computational Complexity Conference (CCC 2017)


Abstract
Questions of noise stability play an important role in hardness of approximation in computer science as well as in the theory of voting. In many applications, the goal is to find an optimizer of noise stability among all partitions of R^n, for n >= 1, into k parts with given Gaussian measures mu_1, ..., mu_k. We call a partition epsilon-optimal if its noise stability is optimal up to an additive epsilon. In this paper, we give an explicit, computable function n(epsilon) such that an epsilon-optimal partition exists in R^{n(epsilon)}. This result has implications for the computability of certain problems in non-interactive simulation, which are addressed in a subsequent work.
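
For concreteness, the Monte Carlo sketch below estimates the Gaussian noise stability of the simplest kind of partition, a split of R^n into k = 2 parts by a halfspace through the origin: draw rho-correlated standard Gaussian pairs (X, Y) and record how often both points fall in the same part. The halfspace, the dimension, and rho are hypothetical choices used only to illustrate the quantity being optimized, not the paper's construction.

import numpy as np

rng = np.random.default_rng(1)
n, rho, samples = 10, 0.8, 200_000
w = rng.standard_normal(n)                   # normal vector of the halfspace

X = rng.standard_normal((samples, n))
Z = rng.standard_normal((samples, n))
Y = rho * X + np.sqrt(1 - rho**2) * Z        # Y is rho-correlated with X

same_part = (X @ w >= 0) == (Y @ w >= 0)     # do both points land on the same side?
print("estimated noise stability:", same_part.mean())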

Cite as

Anindya De, Elchanan Mossel, and Joe Neeman. Noise Stability Is Computable and Approximately Low-Dimensional. In 32nd Computational Complexity Conference (CCC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 79, pp. 10:1-10:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{de_et_al:LIPIcs.CCC.2017.10,
  author =	{De, Anindya and Mossel, Elchanan and Neeman, Joe},
  title =	{{Noise Stability Is Computable and Approximately Low-Dimensional}},
  booktitle =	{32nd Computational Complexity Conference (CCC 2017)},
  pages =	{10:1--10:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-040-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{79},
  editor =	{O'Donnell, Ryan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2017.10},
  URN =		{urn:nbn:de:0030-drops-75390},
  doi =		{10.4230/LIPIcs.CCC.2017.10},
  annote =	{Keywords: Gaussian noise stability; Plurality is stablest; Ornstein Uhlenbeck operator}
}